Variance Competitiveness for Monotone Estimation: Tightening the Bounds

Author

  • Edith Cohen

Abstract

Random samples are extensively used to summarize massive data sets and facilitate scalable analytics. Coordinated sampling, where samples of different data sets “share” the randomization, is a powerful method which facilitates more accurate estimation of many aggregates and similarity measures. We recently formulated a model of Monotone Estimation Problems (MEP), which can be applied to coordinated sampling, projected on a single item. MEP estimators can then be used to estimate sum aggregates, such as distances, over coordinated samples. For MEP, we are interested in estimators that are unbiased and nonnegative. We proposed variance competitiveness as a quality measure of estimators: For each data vector, we consider the minimum variance attainable on it by an unbiased and nonnegative estimator. We then define the competitiveness of an estimator as the maximum ratio, over data, of the expectation of the square to the minimum possible. We also presented a general construction of the L* estimator, which is defined for any MEP for which a nonnegative unbiased estimator exists, and is at most 4-competitive. Our aim here is to obtain tighter bounds on the universal ratio, which we define to be the smallest competitive ratio that can be obtained for any MEP. We obtain an upper bound of 3.375, improving over the bound of 4 of the L* estimator. We also establish a lower bound of 1.44. The lower bound is obtained by constructing the optimally competitive estimator for particular MEPs. The construction is of independent interest, as it facilitates estimation with instance-optimal competitiveness.
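The competitiveness measure described verbally above can be written compactly. The following is a sketch in notation consistent with the abstract, where $v$ ranges over data vectors, $S$ is the sample, and $\hat{f}$ is an unbiased nonnegative estimator; the symbol names $V^*(v)$ and $c(\hat{f})$ are illustrative labels, not taken from the paper:

```latex
% Minimum attainable expected square on data v. Unbiasedness fixes the mean,
% so minimizing the variance on v is equivalent to minimizing E_v[ghat(S)^2]:
V^*(v) \;=\; \inf_{\hat{g}\,\ge\,0,\ \hat{g}\ \text{unbiased}} \mathbb{E}_v\!\left[\hat{g}(S)^2\right]

% Competitive ratio of an estimator: the worst-case ratio over data vectors
c(\hat{f}) \;=\; \sup_{v} \frac{\mathbb{E}_v\!\left[\hat{f}(S)^2\right]}{V^*(v)}
```

In this notation, the paper's results say that for every MEP with a nonnegative unbiased estimator there is an estimator with $c(\hat{f}) \le 3.375$, while some MEPs force $c(\hat{f}) \ge 1.44$ for every such estimator.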


Related articles

On discrete a-unimodal and a-monotone distributions

Unimodality is one of the building structures of distributions that, like skewness, kurtosis and symmetry, is visible in the shape of a function. Comparing two different distributions can be a very difficult task. But if both distributions are of the same type, for example both unimodal, we may simply compare their modes, dispersions and skewness. So, the concept of unimodali...


A Berry-Esseen Type Bound for a Smoothed Version of Grenander Estimator

In various statistical models, such as density estimation and estimation of regression curves or hazard rates, monotonicity constraints can arise naturally. A frequently encountered problem in nonparametric statistics is to estimate a monotone density function f on a compact interval. A known estimator for the density f, under the restriction that f is decreasing, is the Grenander estimator, ...


Competitive Boolean Function Evaluation: Beyond Monotonicity, and the Symmetric Case

We study the extremal competitive ratio of Boolean function evaluation. We provide the first non-trivial lower and upper bounds for classes of Boolean functions which are not included in the class of monotone Boolean functions. For the particular case of symmetric functions our bounds are matching and we exactly characterize the best possible competitiveness achievable by a deterministic algori...


The complete mixability and convex minimization problems with monotone marginal densities

Following the results in Rüschendorf and Uckelmann (2002), we introduce the completely mixable distributions on R and prove that distributions with monotone density and moderate mean are completely mixable. Using this method we solve the minimization problem min_{X_i ∼ P} E f(X_1 + · · · + X_n) for convex functions f and marginal distributions P with monotone density. Our results also provide valuable im...


The Structure of Bhattacharyya Matrix in Natural Exponential Family and Its Role in Approximating the Variance of a Statistics

In most situations the best estimator of a function of the parameter exists, but sometimes it has a complex form and we cannot compute its variance explicitly. Therefore, a lower bound for the variance of an estimator is one of the fundamentals of estimation theory, because it gives us an idea of the accuracy of the estimator. It is well known in statistical inference that the Cramér...



Journal:
  • CoRR

Volume abs/1406.6490  Issue 

Pages  -

Publication date 2014